MOBILE TERMINAL
Patent abstract:
A mobile terminal (100, 200, 300, 400) according to the present invention can provide various user interfaces for modifying an image displayed on a front surface display area (551), a modification function for elements displayed on the front surface display area (551), and other features, using the front surface display area (551) and a side surface display area (552, 554).

Publication number: FR3022370A1
Application number: FR1554385
Filing date: 2015-05-15
Publication date: 2015-12-18
Inventors: Jongpil Kim; Kyunghee Yoo; Anna Yoo; Seungmin Seen
Applicant: LG Electronics Inc.
IPC main class:
Patent description:
[0001] MOBILE TERMINAL

This application claims priority to Korean Application No. 10-2014-0073813, filed on June 17, 2014 in Korea, the entire contents of which are hereby incorporated by reference.

The present invention relates to a mobile terminal capable of providing various user interfaces using a front surface display area and a side surface display area.

Terminals can be generally classified as mobile/portable terminals or stationary terminals depending on their mobility. Mobile terminals can also be classified as handheld terminals or vehicle-mounted terminals depending on whether or not a user can directly carry the terminal.

Mobile terminals have become increasingly functional. Examples of such functions include data and voice communications, image and video capture with a camera, audio recording, playback of music files through a speaker system, and display of images and video on a display. Some mobile terminals include additional functionality for playing games, while other terminals are configured as multimedia players. More recently, mobile terminals have been configured to receive broadcast and multicast signals which permit viewing of content such as videos and television programs.

Efforts are ongoing to support and increase the functionality of mobile terminals. Such efforts include software and hardware improvements, as well as changes and improvements in the structural components.

An object of the present invention is therefore to address the above-noted and other problems. Another object of the present invention is to provide a mobile terminal capable of providing various user interfaces using not only a front surface display area acting as the main display, but also a side surface display area.
[0002] A mobile terminal according to one aspect of the present invention, for achieving the above objects, comprises a display unit incorporating a touch screen function and comprising a front surface display area and a side surface display area; and a controller activating a modification function for an image displayed on the front surface display area in the case where the image is moved beyond the front surface display area in a direction toward the side surface display area.

The scope of applicability of the present application will become more apparent from the detailed description given below. Nevertheless, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from this detailed description.

The present invention will be more fully understood from the detailed description given hereinafter and the accompanying drawings, which are given by way of illustration only and are therefore not limiting of the present invention, and in which:

Fig. 1a is a block diagram of a mobile terminal according to the present disclosure; Figs. 1b and 1c are conceptual views of an example of the mobile terminal, viewed from different directions; Fig. 2 is a conceptual view of a deformable mobile terminal according to an alternative embodiment of the present disclosure; Fig. 3 is a conceptual view of a wearable mobile terminal according to another alternative embodiment of the present disclosure; Fig. 4 is a conceptual view of a wearable mobile terminal according to another alternative embodiment of the present disclosure; FIG.
5 illustrates an example of a mobile terminal according to the present invention comprising a front surface display area and a side surface display area; Fig. 6 illustrates a display method of the mobile terminal of Fig. 5; Fig. 7 illustrates another example of a mobile terminal according to the present invention comprising a front surface display area and a side surface display area; Fig. 8 illustrates a display method of the mobile terminal of Fig. 7; Fig. 9 illustrates example combinations of a front surface display area and side surface display areas operable in a mobile terminal according to the present invention; Fig. 10 is a flowchart illustrating an example of a method of operating a mobile terminal according to the present invention; Figs. 11 to 13 illustrate example implementations of a function for modifying an image displayed on a front surface display area according to the operating method of Fig. 10; Fig. 14 illustrates another example implementation of a function for modifying an image displayed on a front surface display area according to the operating method of Fig. 10; Fig. 15 shows an original image modified according to an image modification method of Fig. 16 and an image obtained as a result of the modification; Fig. 16 is a flowchart illustrating another example of a method of operating a mobile terminal according to the present invention; Fig. 17 is a flowchart illustrating another example of a method of operating a mobile terminal according to the present invention; Figs. 18 to 20 illustrate example executions of an element modification function according to the operating method of Fig. 17; Fig. 21 shows examples of creating a new image file based on the image modification method illustrated in Figs. 18 to 20; Fig. 22 illustrates another example execution of an element modification function according to the operating method of Fig. 17; FIG.
23 illustrates another example execution of an element modification function according to the operating method of Fig. 17; Fig. 24 illustrates another example execution of an element modification function according to the operating method of Fig. 17; Fig. 25 illustrates another example of an element modification method performed in a mobile terminal according to the present invention; Fig. 26 illustrates another example of an element modification method performed in a mobile terminal according to the present invention; Fig. 27 is a flowchart illustrating another example of a method of operating a mobile terminal according to the present invention; and Fig. 28 illustrates an example execution of a function for modifying an element displayed on a front surface display area according to the operating method of Fig. 27.

The present invention will now be described in detail according to the exemplary embodiments disclosed herein, with reference to the accompanying drawings. For the sake of brevity of the description, the same or equivalent components may be given the same or similar reference numerals, and their description will not be repeated. In general, a suffix such as "module" or "unit" may be used to refer to elements or components. Such a suffix is used herein merely to facilitate the description of the specification, and the suffix itself is not intended to carry any special meaning or function. In the present disclosure, matters well known to those skilled in the art are generally omitted for the sake of brevity. The accompanying drawings are used to facilitate the understanding of various technical features, and it should be understood that the embodiments presented herein are not limited by the accompanying drawings.
As such, the present disclosure should be construed to extend to any alterations, equivalents and substitutes in addition to those particularly set out in the accompanying drawings.

It will be understood that although the terms "first", "second", etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another. It will be understood that when an element is referred to as being "connected to" another element, it can be connected to the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected to" another element, no intervening elements are present.

A singular representation may include a plural representation unless the context indicates otherwise. Terms such as "comprise" or "include" are used herein and should be understood as indicating the existence of the several components, functions or steps disclosed herein; it should also be understood that more or fewer components, functions or steps may likewise be utilized.

[0003] The mobile terminals described herein may be implemented using a variety of different types of terminals. Examples of such terminals include cellular phones, smartphones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, portable computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)) and the like. By way of non-limiting example only, particular types of mobile terminals will be described in detail below. Nevertheless, such teachings apply equally to other types of terminals, such as those noted above. In addition, these teachings may also be applied to stationary terminals such as digital TVs, desktop computers and the like.

[0004] Reference will now be made to FIGS. 1a to 1c, in which FIG.
1a is a block diagram of a mobile terminal according to the present disclosure, and FIGS. 1b and 1c are conceptual views of an example of the mobile terminal, viewed from different directions.

[0005] The mobile terminal 100 shown comprises components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and a power supply unit 190. It should be understood that implementing all of the illustrated components is not a requirement, and that more or fewer components may alternatively be implemented.

With reference to FIG. 1a, the mobile terminal 100 is shown having a wireless communication unit 110 configured with several commonly implemented components. For example, the wireless communication unit 110 typically includes one or more components which permit wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal is located. The wireless communication unit 110 generally includes one or more modules which permit communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, or communications between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 generally comprises one or more modules which connect the mobile terminal 100 to one or more networks. To facilitate such communications, the wireless communication unit 110 comprises one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.
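As a reading aid, the reference numerals of paragraph [0005] and the wireless sub-modules just listed can be captured in a small lookup table. This is an illustrative sketch only; the table and the `lookup` helper are assumptions for exposition, not part of any real handset SDK.

```python
from typing import Dict

# Reference numerals from FIG. 1a mapped to the components they denote.
COMPONENTS: Dict[int, str] = {
    110: "wireless communication unit",
    120: "input unit",
    140: "detection unit",
    150: "output unit",
    160: "interface unit",
    170: "memory",
    180: "controller",
    190: "power supply unit",
}

# Sub-modules of the wireless communication unit 110.
WIRELESS_MODULES: Dict[int, str] = {
    111: "broadcast receiving module",
    112: "mobile communication module",
    113: "wireless Internet module",
    114: "short-range communication module",
    115: "location information module",
}

def lookup(numeral: int) -> str:
    """Resolve a reference numeral to its component name."""
    return COMPONENTS.get(numeral) or WIRELESS_MODULES.get(numeral, "unknown")

print(lookup(180))  # controller
print(lookup(114))  # short-range communication module
```

Keeping the sub-modules in a separate table mirrors the text's hierarchy: modules 111 to 115 all belong to the wireless communication unit 110.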
The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is a type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push key, a mechanical key, a soft key, and the like) for enabling a user to input information. Data (for example, audio, video, image, and the like) is obtained by the input unit 120 and may be analyzed and processed by the controller 180 according to device parameters, user commands, and combinations thereof.

[0006] The detection unit 140 is generally implemented using one or more sensors configured to detect internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and the like. For example, in FIG. 1a, the detection unit 140 is shown comprising a proximity sensor 141 and an illumination sensor 142. If desired, the detection unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a fingerprint sensor, an ultrasonic sensor, an optical sensor (for example, the camera 121), the microphone 122, a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor, among others), and a chemical sensor (for example, an electronic nose, a health sensor, a biometric sensor, and the like), to name a few. The mobile terminal 100 may be configured to use information obtained from the detection unit 140, and in particular, information obtained from one or more sensors of the detection unit 140, and combinations thereof.

[0007] The output unit 150 is generally configured to output various types of information, such as audio, video, and tactile outputs, and the like.
The output unit 150 is shown including a display unit 151, an audio output module 152, a haptic module 153, and an optical output module 154.

[0008] The display unit 151 may have a layered structure or an integrated structure with a touch sensor in order to facilitate a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, and it may also function as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user.

[0009] The interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160 may, for example, comprise wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. In some cases, the mobile terminal 100 may perform assorted control functions associated with a connected external device, in response to the external device being connected to the interface unit 160.

The memory 170 is generally implemented to store data supporting various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs may be downloaded from an external server via wireless communication. Other application programs may be installed in the mobile terminal 100 at the time of manufacture or shipment, which is generally the case for basic functions of the mobile terminal 100 (for example, receiving a call, placing a call, receiving a message, sending a message, and the like).
It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) of the mobile terminal 100.

[0010] The controller 180 generally functions to control the overall operation of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 may provide or process information or functions appropriate for a user by processing signals, data, information and the like which are input or output by the various components shown in FIG. 1a, or by activating application programs stored in the memory 170. As one example, the controller 180 controls some or all of the components illustrated in FIGS. 1a to 1c according to the execution of an application program that has been stored in the memory 170.

[0011] The power supply unit 190 may be configured to receive external power or provide internal power in order to supply the appropriate power required for operating the elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery may be configured to be embedded in the terminal body or configured to be detachable from the terminal body.

With reference to FIG. 1a, the various components shown in this figure will now be described in detail. Regarding the wireless communication unit 110, the broadcast receiving module 111 is generally configured to receive a broadcast signal and/or broadcast-associated information from an external broadcast management entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be used to facilitate the simultaneous reception of two or more broadcast channels, or to support switching between broadcast channels.
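The last point, using two or more broadcast receiving modules 111 for simultaneous reception or channel switching, can be sketched as a toy tuner manager. The class below is an illustrative assumption only, not a real broadcast API: each module can be tuned to one channel, so the number of modules bounds how many channels are received at once.

```python
class BroadcastReceiver:
    """Models two or more broadcast receiving modules 111: each module
    tunes one channel, permitting simultaneous reception or switching."""

    def __init__(self, module_count):
        self.tuned = [None] * module_count  # one channel slot per module

    def tune(self, module_index, channel):
        # Retuning an already-used module switches its channel.
        self.tuned[module_index] = channel

    def receiving(self):
        return [ch for ch in self.tuned if ch is not None]

receiver = BroadcastReceiver(module_count=2)
receiver.tune(0, "satellite news")
receiver.tune(1, "terrestrial sports")
print(receiver.receiving())            # two channels received simultaneously
receiver.tune(1, "terrestrial weather")  # switching channels on module 1
print(receiver.receiving())
```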
The mobile communication module 112 can transmit and/or receive wireless signals to and/or from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like. Such network entities form part of a mobile communication network, which is constructed according to technical standards or communication methods for mobile communications (for example, Global System for Mobile Communication (GSM), Code Division Multiple Access (CDMA), CDMA2000, Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like). Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, or various formats of data to support the communication of text and multimedia messages.

[0012] The wireless Internet module 113 is configured to facilitate wireless Internet access. This module may be coupled, internally or externally, to the mobile terminal 100. The wireless Internet module 113 may transmit and/or receive wireless signals via communication networks according to wireless Internet technologies. Examples of such wireless Internet access include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), and the like. The wireless Internet module 113 may transmit/receive data according to one or more of these wireless Internet technologies, as well as other Internet technologies.
In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A and the like, as part of a mobile communication network, the wireless Internet module 113 performs such wireless Internet access. As such, the wireless Internet module 113 may cooperate with, or function as, the mobile communication module 112.

The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTH™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Wireless Universal Serial Bus (Wireless USB), and the like. In general, the short-range communication module 114 supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless networks. One example of such wireless networks is a wireless personal area network.

In some embodiments, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a wearable device, for example a smart watch, smart glasses, or a head-mounted display (HMD), which is able to exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100). The short-range communication module 114 may detect or recognize the wearable device, and permit communication between the wearable device and the mobile terminal 100.
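The hand-off just described, and elaborated in the next paragraph (relaying data processed in the terminal, such as an incoming call, only to a device authenticated to communicate with it), can be sketched as follows. The class and method names are hypothetical, chosen for exposition, and not an actual LG or Android API.

```python
class ShortRangeModule:
    """Sketch of short-range communication module 114 relaying data
    processed in the mobile terminal to an authenticated device."""

    def __init__(self):
        self.authenticated = set()

    def authenticate(self, device_id):
        self.authenticated.add(device_id)

    def relay(self, device_id, processed_data):
        # Only devices authenticated to communicate with the terminal
        # may receive data processed in the mobile terminal.
        if device_id not in self.authenticated:
            return None
        return f"to {device_id}: {processed_data}"

module = ShortRangeModule()
module.authenticate("smart-watch")
print(module.relay("smart-watch", "incoming call"))  # relayed to the watch
print(module.relay("hmd", "incoming call"))          # None: not authenticated
```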
[0013] In addition, when the detected wearable device is a device that is authenticated to communicate with the mobile terminal 100, the controller 180 may, for example, cause data processed in the mobile terminal 100 to be transmitted to the wearable device via the short-range communication module 114. A user of the wearable device can then use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the wearable device. Similarly, when a message is received in the mobile terminal 100, the user can read the received message using the wearable device.

The location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position of the mobile terminal. As an example, the location information module 115 includes a Global Positioning System (GPS) module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally function with any of the other modules of the wireless communication unit 110 to obtain data related to the position of the mobile terminal. As one example, when the mobile terminal uses a GPS module, a position of the mobile terminal may be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal can be acquired based on information related to a wireless access point (AP) which transmits or receives a wireless signal to or from the Wi-Fi module.

The input unit 120 may be configured to permit various types of input to the mobile terminal 100. Examples of such input include audio, image, video, data, and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 may process image frames of still pictures or video obtained by image sensors in a video or image capture mode.
The processed image frames can be displayed on the display unit 151 or stored in the memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to permit a plurality of images having various angles or focal points to be input to the mobile terminal 100. As another example, the cameras 121 may be located in a stereoscopic arrangement to acquire left and right images for implementing a stereoscopic image.

The microphone 122 is generally implemented to permit audio input to the mobile terminal 100. The audio input can be processed in various manners according to a function being executed in the mobile terminal 100. If desired, the microphone 122 may include assorted noise removing algorithms to remove unwanted noise generated in the course of receiving external audio.

The user input unit 123 is a component that permits input by a user. Such user input may enable the controller 180 to control the operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like), or a touch-sensitive input, among others. As one example, the touch-sensitive input may be a virtual key or a soft key, which is displayed on a touch screen through software processing, or a touch key which is located on the mobile terminal at a location other than the touch screen. On the other hand, the virtual key or the visual key may be displayed on the touch screen in various shapes, for example, graphic, text, icon, video, or a combination thereof.

The detection unit 140 is generally configured to detect one or more of internal information of the mobile terminal, surrounding environment information of the mobile terminal, user information, or the like.
The controller 180 generally cooperates with the detection unit 140 to control the operation of the mobile terminal 100 or to execute data processing, a function or an operation associated with an application program installed in the mobile terminal, based on the detection provided by the detection unit 140. The detection unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail.

The proximity sensor 141 may comprise a sensor for detecting the presence or absence of an object approaching a surface, or an object located near a surface, by using an electromagnetic field, infrared rays, or the like, without mechanical contact. The proximity sensor 141 may be arranged at an inner region of the mobile terminal covered by the touch screen, or near the touch screen.

[0014] The proximity sensor 141 may, for example, include any of a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared proximity sensor, and the like. When the touch screen is implemented as a capacitance type, the proximity sensor 141 can detect the proximity of a pointer to the touch screen by changes in an electromagnetic field in response to the approach of an object with conductivity. In this case, the touch screen (touch sensor) may also be categorized as a proximity sensor.

The term "proximity touch" will often be referred to herein to denote the scenario in which a pointer is positioned to be proximate to the touch screen without contacting it. The term "contact touch" will often be referred to herein to denote the scenario in which a pointer makes physical contact with the touch screen.
For the position corresponding to the proximity touch of the pointer relative to the touch screen, such a position corresponds to the position at which the pointer is perpendicular to the touch screen. The proximity sensor 141 can detect a proximity touch, and proximity touch patterns (for example, distance, direction, speed, time, position, moving status, and the like). In general, the controller 180 processes data corresponding to proximity touches and proximity touch patterns detected by the proximity sensor 141, and causes visual information to be output on the touch screen. In addition, the controller 180 can control the mobile terminal 100 to execute different operations or process different data according to whether a touch with respect to a point on the touch screen is a proximity touch or a contact touch.

[0015] A touch sensor can detect a touch applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others. As one example, the touch sensor may be configured to convert changes in pressure applied to a specific part of the display unit 151, or to convert a capacitance occurring at a specific part of the display unit 151, into electrical input signals. The touch sensor may also be configured to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen, a stylus, a pointer, or the like.

When a touch input is detected by a touch sensor, corresponding signals may be transmitted to a touch controller. The touch controller may process the received signals, and then transmit corresponding data to the controller 180. Accordingly, the controller 180 can detect which region of the display unit 151 has been touched.
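The signal path just described (touch sensor to touch controller to controller 180) can be given as a minimal sketch, assuming a capacitive grid where the touched cell is the one with the largest capacitance change; the function name and representation are illustrative assumptions.

```python
def touched_region(capacitance_delta):
    """Return (row, col) of the strongest capacitance change, or None.

    capacitance_delta: 2D list of changes measured by the touch sensor.
    A real touch controller would additionally report the touched area
    and touch pressure, as the text notes.
    """
    best, best_pos = 0.0, None
    for r, row in enumerate(capacitance_delta):
        for c, value in enumerate(row):
            if value > best:
                best, best_pos = value, (r, c)
    return best_pos

grid = [
    [0.0, 0.1, 0.0],
    [0.2, 0.9, 0.3],  # a finger over the middle cell
    [0.0, 0.1, 0.0],
]
print(touched_region(grid))  # (1, 1)
```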
Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or combinations thereof.

In some embodiments, the controller 180 may execute the same or different commands according to the type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same or a different command according to the object which provides a touch input may be decided based on, for example, a current operating state of the mobile terminal 100 or a currently executed application program.

The touch sensor and the proximity sensor may be used individually, or in combination, to detect various types of touches. Such touches include a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like.

[0016] If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180 may, for example, calculate a position of a wave generation source based on information detected by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for the light to reach the optical sensor is much shorter than the time for the ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source may be calculated using this fact. For instance, the position of the wave generation source may be calculated using the time difference from the time the ultrasonic wave reaches the sensor, with the light serving as a reference signal.

The camera 121 typically includes at least one of a camera sensor (CCD, CMOS etc.), a photo sensor (or image sensors), and a laser sensor. Implementing the camera 121 with a laser sensor may allow detection of a touch of a physical object with respect to a 3D stereoscopic image.
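Returning to the ultrasonic positioning of paragraph [0016]: because the light reference arrives effectively instantaneously, the distance from each ultrasonic sensor to the wave source follows from the ultrasonic time of flight alone. A minimal sketch, assuming sound travels at 343 m/s in air:

```python
SPEED_OF_SOUND = 343.0  # m/s in air; the light's travel time is treated as zero

def source_distances(arrival_times):
    """Distance from each ultrasonic sensor to the wave generation source.

    arrival_times: seconds elapsed between the optical sensor detecting the
    light (the reference signal) and each ultrasonic sensor detecting the wave.
    Combining distances from several sensors lets the controller locate the
    source, e.g. by trilateration.
    """
    return [SPEED_OF_SOUND * t for t in arrival_times]

# The wave arrives 1 ms and 2 ms after the light reference:
print(source_distances([0.001, 0.002]))  # approximately 0.343 m and 0.686 m
```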
The photo sensor may be laminated on, or overlapped with, the display device. The photo sensor may be configured to scan movement of the physical object in proximity to the touch screen. In more detail, the photo sensor may include photo diodes and transistors in rows and columns to scan content received at the photo sensor using an electrical signal which changes according to the quantity of applied light. Namely, the photo sensor may calculate the coordinates of the physical object according to variation of light to thus obtain position information of the physical object.

The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executing at the mobile terminal 100 or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information.

In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glasses scheme), an auto-stereoscopic scheme (a glasses-free scheme), a projection scheme (a holographic scheme), or the like.

In general, a 3D stereoscopic image may include a left image (e.g., a left-eye image) and a right image (e.g., a right-eye image).
According to how the left image and the right image are combined into a 3D stereoscopic image, a 3D stereoscopic imaging method can be divided into a top-down method in which the left and right images are located up and down in a frame, a left-to-right (or side-by-side) method in which the left and right images are located left and right in a frame, a checkerboard method in which fragments of the left and right images are located in a tile form, an interlaced method in which the left and right images are alternately located by columns or rows, and a time-sequential (or frame-by-frame) method in which the left and right images are alternately displayed in time. Also, as for a 3D thumbnail image, a left image thumbnail and a right image thumbnail can be generated from the left image and the right image of an original image frame, respectively, and then combined to generate a single 3D thumbnail image. In general, the term "thumbnail" may be used to refer to a reduced image or a reduced still image. A generated left image thumbnail and right image thumbnail may be displayed with a horizontal distance difference between them, by a depth corresponding to the disparity between the left image and the right image on the screen, thereby providing a stereoscopic sense of space. A left image and a right image required to implement a 3D stereoscopic image may be displayed on the stereoscopic display unit using a stereoscopic processing unit. The stereoscopic processing unit can receive a 3D image and extract the left image and the right image, or it can receive a 2D image and change it into a left image and a right image. The audio output module 152 is generally configured to output audio data.
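Three of the frame-packing methods listed above can be sketched minimally. This is a hypothetical illustration, not part of the patent: images are represented as plain lists of pixel rows, and only the top-down, side-by-side, and row-interlaced packings are shown.

```python
# Sketch of three frame-packing schemes for a left/right stereo pair.

def top_down(left, right):
    """Left image on the top half of the frame, right image on the bottom."""
    return left + right

def side_by_side(left, right):
    """Left and right images placed left/right within each row of the frame."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def row_interlaced(left, right):
    """Rows of the left and right images alternated within the frame."""
    frame = []
    for l_row, r_row in zip(left, right):
        frame.append(l_row)
        frame.append(r_row)
    return frame
```

A real implementation would typically also downscale each eye's image so the packed frame keeps the original resolution; that step is omitted here for brevity.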
Such audio data may be obtained from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output in modes such as a signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. The audio output module 152 can provide audible output related to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented in the form of a receiver, a speaker, a buzzer, or the like. A haptic module 153 can be configured to generate various tactile effects that a user feels, perceives, or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is vibration. The strength, pattern, and other characteristics of the vibration generated by the haptic module 153 can be controlled by user selection or set by the controller. For example, the haptic module 153 may output different vibrations in a combined manner or a sequential manner. Besides vibration, the haptic module 153 can generate various other tactile effects, including an effect by stimulation such as a pin arrangement moving vertically to contact the skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, an electrostatic force, an effect by reproducing the sense of cold and warmth using an element that can absorb or generate heat, and the like. [0017] The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscular sensation, such as through the fingers or arm of the user, as well as to transfer the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100.
An optical output module 154 can output a signal for indicating an event generation using light of a light source. Examples of events generated in the mobile terminal 100 may include message reception, call signal reception, a missed call, an alarm, a schedule notice, an e-mail reception, information reception through an application, and the like. A signal output by the optical output module 154 may be implemented in such a manner that the mobile terminal emits monochromatic light or light with a plurality of colors. The signal output may be terminated when, for example, the mobile terminal senses that a user has checked the generated event. The interface unit 160 serves as an interface for external devices to be connected with the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive power to transfer to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like. The identification module may be a chip that stores various information for authenticating authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (referred to hereinafter as an "identification device") may take the form of a smart card. Accordingly, the identification device can be connected with the terminal 100 via the interface unit 160.
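The light-signal life cycle described above (start on event generation, terminate once the user has checked the event) can be sketched as a small state holder. This is our own illustrative model; the class and method names are assumptions, not API from the patent.

```python
# Sketch: an optical-output state that mirrors pending, unchecked events.

class OpticalOutput:
    def __init__(self):
        self.pending = []        # events generated but not yet checked
        self.light_on = False

    def notify(self, event):
        """An event (message, missed call, ...) starts the light signal."""
        self.pending.append(event)
        self.light_on = True

    def user_checked_events(self):
        """Checking the events terminates the signal output."""
        self.pending.clear()
        self.light_on = False
```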
When the mobile terminal 100 is connected with an external cradle, the interface unit 160 can serve as a passage to allow power from the cradle to be supplied to the mobile terminal 100, or as a passage to allow various command signals input by the user from the cradle to be transferred to the mobile terminal. Various command signals or the power input from the cradle may operate as signals for recognizing that the mobile terminal is properly mounted on the cradle. The memory 170 can store programs to support operations of the controller 180 and store input/output data (e.g., a phonebook, messages, photos, videos, etc.). The memory 170 may store data related to various patterns of vibration and audio that are output in response to touch inputs on the touch screen. The memory 170 may include one or more types of storage media, including a flash memory, a hard disk, a solid state disk, a silicon disk, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 may also be operated in relation to a network storage device that performs the storage function of the memory 170 over a network, such as the Internet. The controller 180 may typically control the general operations of the mobile terminal 100. For example, the controller 180 may set or release a lock state for restricting a user from inputting a control command with respect to applications when a status of the mobile terminal meets a preset condition.
The controller 180 can also perform the controlling and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 can control one or a combination of those components in order to implement various exemplary embodiments disclosed herein. The power supply unit 190 receives external power or provides internal power and supplies the appropriate power required for operating respective elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, which is typically rechargeable or detachably coupled to the terminal body for charging. The power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160 to which an external charger is electrically connected to supply the power required to recharge the battery. As another example, the power supply unit 190 may be configured to recharge the battery in a wireless manner without use of the connection port. In this example, the power supply unit 190 can receive power transferred from an external wireless power transmitter using at least one of an inductive coupling method which is based on magnetic induction, or a magnetic resonance coupling method which is based on electromagnetic resonance. Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium, or a similar medium using, for example, software, hardware, or any combination thereof. Referring now to FIGS. 1b and 1c, the mobile terminal 100 will hereinafter be described with reference to a bar-type terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a variety of different configurations.
Examples of such configurations include watch-type, clip-type, and glasses-type, as well as folder-type, flip-type, slide-type, swing-type, and swivel-type in which two or more bodies are combined with each other so as to be movable relative to each other, and combinations thereof. The discussion herein will often relate to a particular type of mobile terminal (for example, bar-type, watch-type, glasses-type, and the like). However, such teachings with regard to a particular type of mobile terminal will generally also apply to other types of mobile terminals. [0018] The mobile terminal 100 will generally include a housing (for example, a frame, a case, a cover, and the like) forming the appearance of the terminal. In the present embodiment, the housing is formed using a front housing 101 and a rear housing 102. Various electronic components are incorporated into a space formed between the front housing 101 and the rear housing 102. At least one middle housing may additionally be positioned between the front housing 101 and the rear housing 102. The display unit 151 is shown located on the front side of the terminal body to output information. As illustrated, a window 151a of the display unit 151 may be mounted to the front housing 101 to form the front surface of the terminal body together with the front housing 101. In some embodiments, electronic components may also be mounted to the rear housing 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card, and the like. A rear cover 103 is shown covering the electronic components, and this cover may be detachably coupled to the rear housing 102. Therefore, when the rear cover 103 is detached from the rear housing 102, the electronic components mounted to the rear housing 102 are externally exposed. As illustrated, when the rear cover 103 is coupled to the rear housing 102, a side surface of the rear housing 102 is partially exposed.
In some cases, upon the coupling, the rear housing 102 may also be completely shielded by the rear cover 103. In some embodiments, the rear cover 103 may include an opening for externally exposing a camera 121b or an audio output module 152b. The housings 101, 102, 103 may be formed by injection-molding a synthetic resin, or may be formed of a metal, for example, stainless steel (STS), aluminum (Al), titanium (Ti), or the like. As an alternative to the example in which the plurality of housings form an inner space for accommodating components, the mobile terminal 100 may be configured such that one housing forms the inner space. In this example, a mobile terminal 100 having a uni-body is formed in such a manner that synthetic resin or metal extends from a side surface to a rear surface. If desired, the mobile terminal 100 may include a waterproofing unit (not shown) for preventing introduction of water into the terminal body. For example, the waterproofing unit may include a waterproofing member that is located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the rear cover 103, to hermetically seal an inner space when those housings are coupled. [0019] Figures 1b and 1c depict certain components as arranged on the mobile terminal. However, it is to be understood that alternative arrangements are possible and within the teachings of the present disclosure. Some components may be omitted or rearranged. For example, the first handling unit 123a may be located on another surface of the terminal body, and the second audio output module 152b may be located on the side surface of the terminal body. The display unit 151 outputs information processed in the mobile terminal 100. The display unit 151 may be implemented using one or more suitable display devices.
Examples of such suitable display devices include a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, an electronic ink display, and combinations thereof. The display unit 151 may be implemented using two display devices, which can employ the same or different display technologies. For instance, a plurality of display units 151 may be arranged on one side, either spaced apart from each other or integrated with each other, or these devices may be arranged on different surfaces. The display unit 151 may also include a touch sensor that senses a touch input received at the display unit. When a touch is input to the display unit 151, the touch sensor may be configured to sense this touch, and the controller 180, for example, may generate a control command or other signal corresponding to the touch. The content that is input in the touch manner may be a text or numerical value, or a menu item that can be indicated or designated in various modes. [0020] The touch sensor may be configured in the form of a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or a metal wire that is patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display. The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch screen may serve as the user input unit 123 (see Figure 1a). Therefore, the touch screen may replace at least some of the functions of the first handling unit 123a. [0021] The first audio output module 152a may be implemented in the form of a speaker to output voice audio, alarm sounds, multimedia audio reproduction, and the like.
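Returning to the touch sensor described above (a sensed touch position from which the controller generates a corresponding control command), the mapping can be sketched minimally. The region map, item names, and the idea of returning a command name are purely illustrative assumptions of ours.

```python
# Sketch: map a touch position reported by the touch sensor to the
# screen item that was hit, i.e. to a control command.

def make_controller(items):
    """`items` maps an item name to its (x, y, width, height) region."""
    def on_touch(x, y):
        for name, (ix, iy, w, h) in items.items():
            if ix <= x < ix + w and iy <= y < iy + h:
                return name      # control command for the touched item
        return None              # touch landed outside every item
    return on_touch
```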
The window 151a of the display unit 151 will typically include an aperture to permit audio generated by the first audio output module 152a to pass. One alternative is to allow audio to be released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front housing 101). In this case, a hole independently formed to output audio sounds may not be seen or may otherwise be hidden in terms of appearance, thereby further simplifying the appearance and manufacturing of the mobile terminal 100. The optical output module 154 can be configured to output light for indicating an event generation. Examples of such events include message reception, call signal reception, a missed call, an alarm, a schedule notice, an e-mail reception, information reception through an application, and the like. When a user has checked a generated event, the controller can control the optical output module 154 to stop the light output. [0022] The first camera 121a can process image frames such as still or moving images obtained by the image sensor in a capture mode or a video call mode. The processed image frames can then be displayed on the display unit 151 or stored in the memory 170. [0023] The first and second handling units 123a and 123b are examples of the user input unit 123, which may be manipulated by a user to provide input to the mobile terminal 100. The first and second handling units 123a and 123b may also be commonly referred to as a manipulating portion. The first and second handling units 123a and 123b may employ any tactile method that allows the user to perform manipulation such as touch, push, scroll, or the like. The first and second handling units 123a and 123b may also employ any non-tactile method that allows the user to perform manipulation such as proximity touch, hovering, or the like.
Figure 1b illustrates the first handling unit 123a as a touch key, but possible alternatives include a mechanical key, a push key, a touch key, and combinations thereof. Input received at the first and second handling units 123a and 123b may be used in various ways. For example, the first handling unit 123a may be used by the user to provide an input to a menu, a home key, a cancel, a search, or the like, and the second handling unit 123b may be used by the user to provide an input to control a volume level being output from the first or second audio output module 152a or 152b, to switch to a touch recognition mode of the display unit 151, or the like. As another example of the user input unit 123, a rear input unit (not shown) may be located on the rear surface of the terminal body. The rear input unit can be manipulated by a user to provide input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide an input for power on/off, start, end, scroll, control of the volume level being output from the first or second audio output module 152a or 152b, switching to a touch recognition mode of the display unit 151, and the like. The rear input unit may be configured to permit touch input, push input, or combinations thereof. The rear input unit may be located to overlap the display unit 151 of the front side in a thickness direction of the terminal body. As one example, the rear input unit may be located on an upper end portion of the rear side of the terminal body such that a user can easily manipulate it using a forefinger when the user grabs the terminal body with one hand. Alternatively, the rear input unit can be positioned at most any location of the rear side of the terminal body. Embodiments that include the rear input unit may implement some or all of the functionality of the first handling unit 123a in the rear input unit.
In situations in which the first handling unit 123a is omitted from the front side, the display unit 151 can have a larger screen. As a further alternative, the mobile terminal 100 may include a fingerprint sensor that scans a user's fingerprint. The controller 180 can then use fingerprint information sensed by the fingerprint sensor as part of an authentication procedure. The fingerprint sensor may also be installed in the display unit 151 or implemented in the user input unit 123. The microphone 122 is shown located at an end of the mobile terminal 100, but other locations are possible. If desired, multiple microphones may be implemented, with such an arrangement permitting the receiving of stereo sounds. The interface unit 160 may serve as a path allowing the mobile terminal 100 to interface with external devices. For example, the interface unit 160 may include one or more of a connection terminal for connecting to another device (for example, an earphone, an external speaker, or the like), a port for short-range communication (for example, an Infrared Data Association (IrDA) port, a Bluetooth port, a wireless LAN port, and the like), or a power supply terminal for supplying power to the terminal 100. The interface unit 160 may be implemented in the form of a socket for accommodating an external card, such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for information storage. The second camera 121b is shown located at the rear side of the terminal body and includes an image capturing direction that is substantially opposite to the image capturing direction of the first camera unit 121a. If desired, the second camera 121b may alternatively be located at other locations, or made to be movable, in order to have an image capturing direction different from that which is shown. The second camera 121b can include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration.
Such cameras may be referred to as an "array camera." When the second camera 121b is implemented as an array camera, images may be captured in various manners using the plurality of lenses, and images with better quality may be obtained. As shown in Figure 1c, a flash 124 is located adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. As shown in Figure 1b, the second audio output module 152b can be located on the terminal body. The second audio output module 152b may implement stereophonic sound functions in conjunction with the first audio output module 152a, and may also be used for implementing a speakerphone mode for call communication. At least one antenna for wireless communication may be located on the terminal body. The antenna may be installed in the terminal body or formed by the housing. For example, an antenna that configures a part of the broadcast receiving module 111 may be retractable into the terminal body. Alternatively, an antenna may be formed using a film attached to an inner surface of the rear cover 103, or a housing that includes a conductive material. [0024] A power supply unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to an outside of the terminal body. The battery 191 may receive power via a power source cable connected to the interface unit 160. Also, the battery 191 can be recharged in a wireless manner using a wireless charger. Wireless charging may be implemented by magnetic induction or electromagnetic resonance. The rear cover 103 is shown coupled to the rear housing 102 for shielding the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from external impact or foreign material. When the battery 191 is detachable from the terminal body, the rear cover 103 may be detachably coupled to the rear housing 102.
An accessory for protecting the appearance, or for assisting or extending the functions of the mobile terminal 100, can also be provided on the mobile terminal 100. One example of an accessory is a cover or pouch for covering or accommodating at least one surface of the mobile terminal 100. The cover or pouch may cooperate with the display unit 151 to extend the function of the mobile terminal 100. Another example of an accessory is a touch pen for assisting or extending a touch input to a touch screen. Figure 2 is a conceptual view of a deformable mobile terminal according to an alternative embodiment of the present invention. In this figure, the mobile terminal 200 is shown having a display unit 251, which is a type of display that is deformable by an external force. This deformation, which involves the display unit 251 and other components of the mobile terminal 200, may include any of curving, bending, folding, twisting, rolling, and combinations thereof. The deformable display unit 251 may also be referred to as a flexible display unit. In some implementations, the flexible display unit 251 may include a general flexible display, electronic paper (also known as e-paper), and combinations thereof. In general, the mobile terminal 200 may be configured to include features that are the same as or similar to those of the mobile terminal 100 of Figures 1a to 1c. [0025] The flexible display of the mobile terminal 200 is generally formed as a lightweight, non-fragile display that still exhibits the characteristics of a conventional flat panel display, but is instead fabricated on a flexible substrate that can be deformed as noted previously. [0026] The term "electronic paper" may be used to refer to a display technology employing the characteristics of general ink, and differs from a conventional flat panel display in that reflected light is used. Electronic paper may change displayed information using a twist ball or electrophoresis using capsules.
[0027] In a state in which the flexible display unit 251 is not deformed (for example, in a state with an infinite radius of curvature, referred to as a first state), a display region of the flexible display unit 251 includes a generally flat surface. In a state in which the flexible display unit 251 is deformed from the first state by an external force (for example, a state with a finite radius of curvature, referred to hereinafter as a second state), the display region may become a curved surface or a bent surface. As illustrated, information displayed in the second state may be visual information output on the curved surface. The visual information may be realized in such a manner that the light emission of each unit pixel (sub-pixel) arranged in a matrix configuration is controlled independently. The unit pixel denotes an elementary unit for representing one color. According to an alternative embodiment, the first state of the flexible display unit 251 may be a curved state (for example, a state of being curved from up to down or from right to left), instead of being a flat state. In this embodiment, when an external force is applied to the flexible display unit 251, the flexible display unit 251 may transition to the second state such that the flexible display unit is deformed into the flat state (or into a less curved state) or into a more curved state. [0028] If desired, the flexible display unit 251 may implement a flexible touch screen using a touch sensor in combination with the display. When a touch is received at the flexible touch screen, the controller 180 can execute a certain command corresponding to the touch input. In general, the flexible touch screen is configured to sense touch and other input in both the first and second states. One option is to configure the mobile terminal 200 to include a deformation sensor that senses the deformation of the flexible display unit 251. [0029] The deformation sensor may be included in the sensing unit 140.
The deformation sensor may be located in the flexible display unit 251 or in the housing 201 to sense information related to the deformation of the flexible display unit 251. Examples of such information related to the deformation of the flexible display unit 251 may be a deformation direction, a degree of deformation, a deformation position, a deformation duration, an acceleration with which the deformed flexible display unit 251 is restored, and the like. Other possibilities include information that can be sensed in response to the curving of the flexible display unit, or sensed while the flexible display unit 251 is transitioning into, or existing in, the first and second states. In some embodiments, the controller 180 or another component can change information displayed on the flexible display unit 251, or generate a control signal for controlling a function of the mobile terminal 200, based on the information related to the deformation of the flexible display unit 251; such information is generally sensed by the deformation sensor. The mobile terminal 200 is shown having a housing 201 for accommodating the flexible display unit 251. The housing 201 can be deformable together with the flexible display unit 251, taking into account the characteristics of the flexible display unit 251. A battery (not shown in this figure) located in the mobile terminal 200 may also be deformable in cooperation with the flexible display unit 251, taking into account the characteristics of the flexible display unit 251. One technique to implement such a battery is to use a stack-and-folding method of stacking battery cells. The deformation of the flexible display unit 251 is not limited to deformation by an external force. For example, the flexible display unit 251 can be deformed from the first state into the second state by a user command, an application command, or the like. According to still further embodiments, a mobile terminal may be configured as a device that is wearable on a human body.
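Returning to the deformation sensor described above, the controller's reaction to sensed deformation information can be sketched minimally. The threshold, the direction labels, and the action names are entirely our own assumptions; the patent only states that displayed information may change based on the sensed deformation.

```python
# Sketch: decide a display action from a reported bend degree/direction.

def on_deformation(degree, direction, threshold=30):
    """Return a display action for a reported bend.
    `degree` is in arbitrary bend units; `direction` is 'in' or 'out'.
    Below `threshold` the panel is treated as flat (first state)."""
    if degree < threshold:
        return "no-op"                        # effectively the first state
    return "curved-ui" if direction == "in" else "split-ui"
```

A real controller would of course consume richer data (position, duration, restoration acceleration), but the dispatch shape would be similar.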
Such devices go beyond the usual technique of a user grasping the mobile terminal with a hand. Examples of wearable devices include a smart watch, smart glasses, a head-mounted display (HMD), and the like. A typical wearable device can exchange data with (or cooperate with) another mobile terminal 100. In such a case, the wearable device generally has functionality that is less than that of the cooperating mobile terminal. For instance, the short-range communication module 114 of a mobile terminal 100 may sense or recognize a wearable device that is near enough to communicate with the mobile terminal. In addition, when the sensed wearable device is a device that is authenticated to communicate with the mobile terminal 100, the controller 180 may transmit data processed in the mobile terminal 100 to the wearable device via the short-range communication module 114, for example. Hence, a user of the wearable device can use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the wearable device. Also, when a message is received in the mobile terminal 100, the user can check the received message using the wearable device. Figure 3 is a perspective view illustrating one example of a watch-type mobile terminal 300 in accordance with another exemplary embodiment. As illustrated in Figure 3, the watch-type mobile terminal 300 includes a main body 301 with a display unit 351 and a bracelet 302 connected to the main body 301 so as to be wearable on a wrist. In general, the mobile terminal 300 may be configured to include features that are the same as or similar to those of the mobile terminal 100 of Figures 1a to 1c. [0030] The main body 301 may include a housing having a certain appearance. As illustrated, the housing may include a first housing 301a and a second housing 301b cooperatively defining an inner space for accommodating various electronic components.
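The cooperation described above (relaying processed data only to a wearable device that has been authenticated over short-range communication) can be sketched as follows. The class, method names, and string identifiers are our own illustrative assumptions.

```python
# Sketch: relay events to an authenticated wearable over a short-range link.

class Terminal:
    def __init__(self):
        self.authenticated = set()   # wearables allowed to communicate
        self.relayed = []            # (device, event) pairs sent out

    def authenticate(self, device):
        """Mark a sensed wearable as authenticated for communication."""
        self.authenticated.add(device)

    def on_event(self, device, event):
        """Relay `event` (e.g. an incoming call) to `device` if allowed.
        Returns True when the event was relayed, False otherwise."""
        if device in self.authenticated:
            self.relayed.append((device, event))
            return True
        return False
```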
However, other configurations are possible. [0031] For instance, a single housing may alternatively be implemented, with such a housing being configured to define the inner space, thereby implementing a mobile terminal 300 with a uni-body. The watch-type mobile terminal 300 can perform wireless communication, and an antenna for the wireless communication can be installed in the main body 301. The antenna may extend its function using the housing. For example, a housing including a conductive material may be electrically connected to the antenna to extend a ground area or a radiation area. The display unit 351 is shown located at the front side of the main body 301 so that displayed information is viewable by a user. In some embodiments, the display unit 351 includes a touch sensor so that the display unit can function as a touch screen. As illustrated, a window 351a is positioned on the first housing 301a to form a front surface of the terminal body together with the first housing 301a. [0032] The illustrated embodiment includes an audio output module 352, a camera 321, a microphone 322, and a user input unit 323 positioned on the main body 301. When the display unit 351 is implemented as a touch screen, additional function keys may be minimized or eliminated. For example, when the touch screen is implemented, the user input unit 323 may be omitted. The bracelet 302 is commonly worn on the user's wrist and may be made of a flexible material to facilitate wearing of the device. As one example, the bracelet 302 may be made of leather, rubber, silicone, synthetic resin, or the like. The bracelet 302 may also be configured to be detachable from the main body 301. Accordingly, the bracelet 302 may be replaceable with various types of bracelets according to a user's preference. In one configuration, the bracelet 302 may be used to extend the performance of the antenna. For example, the bracelet may include therein a ground extending portion (not shown) electrically connected to the antenna to extend a ground area.
The bracelet 302 may comprise a fastener 302a. The fastener 302a can be implemented in the form of a buckle, a hook, a Velcro fastener, etc., and it comprises a flexible section or material. The drawing illustrates an example in which the fastener 302a is implemented in the form of a buckle. FIG. 4 is a perspective view illustrating an example of a glasses-type mobile terminal 400 according to another exemplary embodiment. The glasses-type mobile terminal 400 can be worn on the head of a human body and is provided with frames (a case, a housing, etc.) for that purpose. The frames can be made of a flexible material so that the terminal can be worn easily. The frames of the mobile terminal 400 are shown as having a first frame 401 and a second frame 402, which may be of identical or different materials. In general, the mobile terminal 400 may be configured to include features that are identical or similar to those of the mobile terminal 100 of Figs. 1a-1c. The frames are supported on the head and define a space for mounting various components. As illustrated, electronic components, such as a control module 480, an audio output module 452, etc., can be mounted on the frame part. A lens 403 for covering the left eye and/or the right eye may be detachably coupled to the frame portion. The control module 480 controls various electronic components arranged in the mobile terminal 400. The control module 480 can be considered as a component corresponding to the aforementioned controller 180. Figure 4 illustrates that the control module 480 is installed in the frame part on one side of the head, but other locations are possible. The display unit 451 may be implemented as a head-mounted display (HMD). HMD refers to display techniques in which a display is mounted on the head to present an image directly in front of a user's eyes.
To provide an image directly in front of the user's eyes when the user is wearing the glasses-type mobile terminal 400, the display unit 451 may be located to correspond to the left eye and/or the right eye. Figure 4 illustrates that the display unit 451 is located on a portion corresponding to the right eye to deliver an image visible to the user's right eye. [0033] The display unit 451 can project an image onto the user's eye using a prism. The prism can be made of an optically transparent material so that the user can see both the projected image and the general field of view (the range that the user sees with his eyes) in front of the user. In this way, the image delivered by the display unit 451 can be seen overlapping the general field of view. The mobile terminal 400 can provide augmented reality (AR) by superimposing a virtual image on a realistic image or a background using such a display. [0034] The camera 421 may be located adjacent to the left eye and/or the right eye to acquire an image. Since the camera 421 is located adjacent to the eye, the camera 421 can acquire a scene that the user is seeing. The camera 421 can be positioned at virtually any location of the mobile terminal. In some embodiments, multiple cameras 421 may be used; several cameras 421 can be used to acquire a stereoscopic image. The glasses-type mobile terminal 400 may comprise user input units 423a and 423b, each of which may be manipulated by the user to provide an input. The user input units 423a and 423b may employ techniques for input via a touch input. Typical touch inputs include a touch, a push, or the like. The user input units 423a and 423b are shown as being usable by pushing and touching, as they are located on the frame part and on the control module 480, respectively. [0035] If desired, the mobile terminal 400 may include a microphone that processes an input sound into electrical audio data, and an audio output module 452 for outputting audio.
The audio output module 452 may be configured to produce audio in a general audio output manner or in an osteoconductive manner. When the audio output module 452 is implemented in the osteoconductive manner, the audio output module 452 adheres closely to the head when the user wears the mobile terminal 400 and vibrates the user's skull to transfer sounds. [0036] Other preferred embodiments will be described in detail hereinafter with reference to other figures. Those skilled in the art will realize that the present features can be realized in various forms without departing from the scope of the present invention. [0037] For ease of description, the present invention assumes that the display unit 151 is implemented as a touch screen 151. The touch screen 151 may simultaneously perform an information display function and an information input function. FIG. 5 illustrates an example of a mobile terminal 500 according to the present invention, comprising a front surface display area 551 and a side surface display area 552. The side surface display area 552 can be formed by extending the front surface display area 551. While the front surface display area 551 is flat, the side surface display area 552 may be rounded. The two display areas of Fig. 5 are distinguished from each other by a dotted line, which indicates that the two display areas may or may not be physically separated from each other. In other words, the two display areas can be realized as a single physically connected display area, and the dotted line is only intended to indicate that the two display areas can be used by being functionally separated from each other. This assumption applies in the same way to other examples discussed below. The front surface display area 551 may act as a main display of the mobile terminal 500, while the side surface display area 552 may act as an auxiliary display.
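As a rough illustration of this main/auxiliary split, the following Python sketch routes an application screen to the front area and a status line (time, signal, battery) to the side area. The dict-based areas, field names, and `route_content` function are purely hypothetical conveniences for illustration, not part of any actual terminal API.

```python
def route_content(front_area, side_area, app_screen, status):
    """Assign roles as described: the front area carries the running
    application's screen, the side area carries an indicator line.
    The area objects are plain dicts here (illustrative only)."""
    front_area["content"] = app_screen
    # indicator line: current time, communication status, battery level
    side_area["content"] = "{time} {signal} {battery}%".format(**status)
    return front_area, side_area


front, side = route_content(
    {}, {}, "home_screen",
    {"time": "12:00", "signal": "LTE", "battery": 80},
)
print(side["content"])
```

The same routing could equally target an application-specific GUI on the side area instead of the indicator line, as the description notes.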
This assumption also applies to other embodiments discussed hereinafter. For example, the front surface display area 551 may display a background screen, while the side surface display area 552 may display an indicator area. The indicator area may display the current time, a communication status of at least one communication module, and a battery meter reading. In another example, the front surface display area 551 may display an execution screen of a running application, while the side surface display area 552 may display a graphical user interface (GUI) for performing a function related to the running application. Fig. 6 illustrates a display method for the mobile terminal 500 of Fig. 5. Fig. 6 (a) shows part of a cross-section of the mobile terminal 500. As shown in Fig. 6 (a), the display surface of the front surface display area 551 is flat, but the display surface of the side surface display area 552 is rounded. Due to this structure, distortion is observed in an image displayed on the side surface display area 552. The controller 180 of the mobile terminal 500 compensates for this image distortion. In other words, the controller 180 processes the image data corresponding to the side surface display area 552 so that the image data appears undistorted on the surface actually projected onto the eye. Fig. 6 (b) illustrates a conceptual layout of display areas for the front of the mobile terminal 500. Referring to Fig. 6 (b), it can be seen that the side surface display area 552 is disposed on the right side of the front surface display area 551 and both areas are used to display image data. Fig. 7 illustrates another example of a mobile terminal 500 according to the present invention, comprising a front surface display area 551 and a side surface display area 552. The side surface display area 552 may be formed as an extension of the front surface display area 551. Unlike the example of Fig.
5, the display surface of the side surface display area 552 may be a flat surface forming a slope with respect to the front surface display area 551. Fig. 8 illustrates a display method for the mobile terminal 500 of Fig. 7. Fig. 8 (a) shows a portion of a cross-section of the mobile terminal 500. With reference to Fig. 8 (a), the display surface of the side surface display area 552 may be a flat surface sloping downward with respect to the display surface of the front surface display area 551. Because of this structure, an image displayed on the side surface display area 552 is distorted. By compensating for the image distortion, the image on the surface 553 actually projected onto the eye may be devoid of any distortion. Fig. 8 (b) illustrates a conceptual layout of display areas for the front of the mobile terminal 500. With reference to Fig. 8 (b), it can be seen that the side surface display area 552 is disposed on the right side of the front surface display area 551 and both areas are used to display image data. Figure 9 illustrates exemplary combinations of a front surface display area and side surface display areas, which may be implemented in a mobile terminal according to the present invention. It should be noted that the combination examples only take into account a conceptual layout of the display areas for the front of the mobile terminal. Referring to Fig. 9 (a), a mobile terminal according to the present invention may comprise two rectangular-shaped side surface display areas disposed on the left side and the right side of a rectangular-shaped front surface display area. It should be noted that the front surface display area and the side surface display areas are not limited to a rectangular shape. This shape assumption applies in the same way to the other examples described below. As described above, the display surface of the side surface display area may be curved or flat.
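The compensation described for the rounded area of Fig. 6 can be illustrated with a simple geometric model: if the side panel is treated as a cylindrical arc of known radius, columns that should appear evenly spaced in the viewer's projection can be mapped back to positions along the arc, and the image data resampled accordingly. The cylinder model, parameters, and function name below are assumptions for illustration, not the controller 180's actual algorithm.

```python
import math

def prewarp_columns(num_cols, radius, arc_angle):
    """Map evenly spaced projected columns back to normalized positions
    along a curved side panel, modeled as a cylindrical arc.

    A panel point at angle theta projects toward the viewer at
    x = radius * sin(theta); inverting this projection tells us which
    panel position each projected column should sample, so that the
    displayed image appears undistorted. (Illustrative model only.)
    """
    projected_width = radius * math.sin(arc_angle)
    panel_len = radius * arc_angle  # arc length of the side panel
    cols = []
    for c in range(num_cols):
        # target projected x, evenly spaced for the viewer
        x = projected_width * (c + 0.5) / num_cols
        theta = math.asin(x / radius)   # angle on the cylinder
        s = radius * theta              # arc position on the panel
        cols.append(s / panel_len)      # normalized panel position in [0, 1]
    return cols
```

Because `asin` grows faster than linearly near the panel edge, the mapping stretches the outer columns, which matches the intuition that the far edge of a curved panel is foreshortened in the viewer's projection.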
The display surface of the side surface display area may form a slope with respect to the display surface of the front surface display area. This assumption applies in the same way to the other examples described hereinafter. Referring to Fig. 9 (b), a mobile terminal according to the present invention may include a side surface display area disposed in the upper portion of the front surface display area. Referring to Fig. 9 (c), a mobile terminal according to the present invention may comprise two side surface display areas arranged separately in the upper portion and in the lower portion of the front surface display area. [0038] Referring to Fig. 9 (d), a mobile terminal according to the present invention may include three side surface display areas disposed on both sides as well as in the upper portion of the front surface display area. Referring to Fig. 9 (e), a mobile terminal according to the present invention may include three side surface display areas disposed on both sides as well as in the lower portion of the front surface display area. Referring to Fig. 9 (f), a mobile terminal according to the present invention may comprise four side surface display areas arranged on both sides as well as in the upper portion and in the lower portion of the front surface display area. Fig. 10 is a flowchart illustrating an example of a method of operating a mobile terminal according to the present invention. The operating method will be described hereinafter with reference to the corresponding accompanying drawings. [0039] A particular image is displayed on the front surface display area S100. The particular image may be a result of processing a particular image file, or an acquisition of a screen displayed on the front surface display area according to a user's operation. The particular image may correspond to one of several images displayed on the front surface display area.
When the particular image is displayed on the front surface display area, a sliding movement of the particular image in a direction toward the side surface display area is received S110. In the case where the particular image is the only object displayed on the front surface display area, a sliding touch input from an arbitrary position of the front surface display area toward the side surface display area corresponds to a slide of the particular image toward the side surface display area. If the particular image corresponds to one of several images displayed on the front surface display area, a sliding touch input received on the particular image toward the side surface display area may correspond to a slide of the particular image toward the side surface display area. If a sliding movement toward the side surface display area with respect to the particular image is received, the controller 180 activates a modification function for the particular image S120. The controller 180 may activate the modification function for the particular image only when the particular image is slid beyond the front surface display area into the side surface display area. [0040] The editing function for the particular image includes deleting at least a portion of an image, cutting at least a portion of an image, changing a color of at least a portion of an image, rotating an image, and duplicating at least a portion of an image. Nevertheless, the technical scope of the present invention is not limited to the above example. [0041] After the modification of the particular image on the basis of the activated image modification function, the controller 180 stores the modified image in the memory 170 according to a user's operation or in the case where a predetermined condition is fulfilled S130.
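The S100-S130 flow above can be sketched as a minimal state machine: editing activates only once the drag crosses the boundary of the front area into the side area, and storage applies only while editing is active. The class, its field names, and the pixel threshold are all hypothetical, intended only to make the sequencing of the steps concrete.

```python
class ImageEditController:
    """Sketch of the S100-S130 flow: display an image (S100), receive a
    drag toward the side area (S110), activate editing when the drag
    crosses past the front area (S120), and store the result (S130).
    Names and the width value are illustrative assumptions."""

    FRONT_WIDTH = 1080  # assumed front display width in pixels

    def __init__(self):
        self.current = None
        self.edit_active = False
        self.stored = []

    def display_image(self, image):  # S100
        self.current = image

    def on_drag(self, end_x):  # S110 / S120
        # activate editing only when the drag crosses into the side area
        if end_x > self.FRONT_WIDTH:
            self.edit_active = True

    def store(self, modified):  # S130
        if self.edit_active:
            self.stored.append(modified)
            self.edit_active = False
```

A drag ending inside the front area leaves the modification function inactive, mirroring the condition that the image must be slid beyond the front surface display area.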
At this point, the user's operation includes manipulation of an element included in the graphical user interface provided to perform the modification function, and an operation of a hardware button previously mapped to a storage function for a modified image. The predetermined condition may include the case where the modification function is not performed for longer than a predetermined time. [0042] FIGS. 11 to 13 illustrate exemplary embodiments of a modification function for an image displayed on a front surface display area according to the operating method of a mobile terminal of FIG. 10. In the examples of FIGS. 11 to 13, it is assumed that the display unit 151 includes the front surface display area 551 and the side surface display areas 552 and 554 disposed on the left side and the right side of the front surface display area. Fig. 11 (a) illustrates a case in which thumbnails 551A corresponding to a plurality of image files are displayed on the front surface display area 551. For example, the screen can correspond to an execution screen of an image viewing application. The mark "V" indicates a thumbnail image corresponding to a particular image chosen from among the plurality of thumbnails 551A. The selection of the particular image can be performed based on a simple touch input of the user. [0043] If a thumbnail image corresponding to the particular image is selected as shown in Fig. 11 (a), the controller 180 displays a screen executing the particular image file on the front surface display area 551, as shown in Fig. 11 (b). Fig. 11 (c) illustrates a case in which a slide of the particular image toward the side surface display area 552 is performed while the particular image is displayed on the front surface display area 551. Fig. 12 (a) illustrates a case in which the particular image is slid past the side surface display area 552. The controller 180 then activates a modification function with respect to the particular image.
According to an implementation of the present invention, the controller 180 may activate the modification function for the particular image in the case where a sliding motion on the particular image is maintained for a predetermined time. For example, in a case where a touch input for selecting the particular image is dragged to the side surface display area 552 and the touch input is held for a predetermined duration (for example, 1 second), the controller 180 may activate the modification function for the particular image. Once a modification function for the particular image is activated, the controller 180, as shown in Fig. 12 (a), can display a graphical user interface (GUI) for performing a modification function for the particular image on the side surface display area 552, and display on the side surface display area 552 the portion of the particular image slid past the front surface display area 551. [0044] The graphical user interface may include icons representing a function for deleting or trimming the portion of the particular image that has passed into the side surface display area, a rotation function for the particular image, a modification cancel function for the particular image, and a storage function for a modified image. Nevertheless, the technical scope of the present invention is not limited to the above example. The controller 180 may display a graphical user interface for performing a modification function with respect to the particular image on another side surface display area 554 disposed on the left side of the front surface display area 551. The controller may display guide lines 556, at the boundary of the two display areas 551, 552, to indicate the amount of the particular image that has been slid from the front surface display area 551 into the side surface display area 552. Fig. 12 (b) illustrates an operation of performing a function of deleting a portion of the particular image that has passed into the side surface display area 552.
More specifically, Fig. 12 (b) illustrates a case in which the user makes a rubbing touch movement up and down on the side surface display area 552 and thereby deletes the portion of the particular image that has been dragged into the side surface display area 552. A deleted portion of the particular image may be separately stored in the memory 170. This scheme may be similar to a function of splitting an image file or text. According to another embodiment of the present invention, the controller 180 may delete the portion of the particular image that has been slid into the side surface display area 552 upon selection of a delete icon included in a graphical user interface provided in the side surface display area 552. According to another embodiment of the present invention, the controller 180 may perform a function of automatically deleting the portion of the particular image that has been dragged. For example, in the case where the particular image is slid into the side surface display area 554 in the opposite direction, the controller 180 may automatically delete a left portion of the particular image that has been slid into the side surface display area 554. Fig. 12 (c) illustrates a case in which thumbnail images representing successive modifications of the particular image are displayed on another side surface display area 554 disposed in the direction opposite to the side surface display area 552. According to another embodiment of the present invention, the other side surface display area 554 may provide a graphical user interface corresponding to a modification process with respect to the particular image, or a graphical user interface for performing a modification function with respect to the particular image. Fig.
13 (a) illustrates a case in which the user selects an icon corresponding to an image rotation function included in a graphical user interface for performing an image modification function displayed on the side surface display area 552, and rotates the particular image by a touch motion on the particular image displayed on the front surface display area 551. After the rotation of the particular image, the user can perform a delete function for an upper part of the particular image by dragging the rotated image into the side surface display area 552. In this manner, the delete function can also be performed for a left side area or a lower part of the particular image. Fig. 13 (b) illustrates a case in which the user selects an icon corresponding to a modified image storage function included in a graphical user interface for performing an image editing function displayed on the side surface display area 552 and stores a modification result with respect to the particular image. FIG. 14 illustrates another exemplary embodiment of a modification function for an image displayed on a front surface display area according to the operating method of a mobile terminal of FIG. 10. FIG. 14 (a) illustrates a case in which the user touches a particular website screen displayed on the front surface display area 551 for longer than a predetermined time. The controller 180 then performs an acquisition of the website screen and, as shown in Fig. 14 (b), displays the acquired image of the website screen on the front surface display area 551. Fig. 14 (b) illustrates a case wherein the user selects the acquired image of the website screen by applying a touch input to the front surface display area 551 while the acquired image of the website screen is displayed on the front surface display area 551, and drags the image beyond the front surface display area 551 into the side surface display area 552.
[0045] The controller 180 then displays the portion of the acquired image of the website screen that has been slid past the front surface display area 551 into the side surface display area 552, and activates a modification function for the acquired image of the website screen. [0046] Fig. 14 (c) illustrates a case in which an operation of performing a delete function for the portion of the acquired image of the website screen slid past the side surface display area 552 is performed. More specifically, Fig. 14 (c) illustrates a case in which the user applies a rubbing touch movement up and down on the side surface display area 552 and removes the portion of the acquired image of the website screen slid past the side surface display area 552. Fig. 15 shows an original image modified according to the image modification process of Fig. 14 and an image obtained as a result of the modification. The original image of Fig. 15 (a) represents an acquired image of a particular web page. An illustrative process of creating the image 600 will be described hereinafter. First, the user moves the original image to the side surface display area 552 and deletes the portion on the right side of the image 559 that the user wishes to extract from the original image. Then, rotating the image whose right part was deleted, the user deletes an upper part, a left part, and a lower part of the image 559 one after the other, thereby obtaining the image 560. A different example will be described below in which the image 600 is created in the case where side surface display areas are arranged on the left side and on the right side as well as in the upper part and in the lower part of the front surface display area 551.
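The edge-by-edge deletion used above to produce the image 560 amounts to successive crops of the picture. A minimal sketch, treating an image as a 2D list of pixel values (the function name and row/column representation are assumptions for illustration):

```python
def trim_edges(image, left=0, right=0, top=0, bottom=0):
    """Remove rows/columns from each edge of a 2D pixel grid, mimicking
    the slide-and-delete sequence: each slide into a side area deletes
    the corresponding edge of the image. (Illustrative only.)"""
    h = len(image)
    w = len(image[0])
    return [row[left:w - right] for row in image[top:h - bottom]]


page = [[1, 2, 3, 4, 5],
        [6, 7, 8, 9, 10],
        [11, 12, 13, 14, 15],
        [16, 17, 18, 19, 20]]
# deleting one row/column from every edge keeps only the interior
print(trim_edges(page, left=1, right=1, top=1, bottom=1))
```

With four side surface display areas, all four trims can be expressed in one call; with a single side area, the same result is obtained by rotating the image between right-edge trims, as the description explains.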
The user slides the original image to the side surface display area on the right side and removes the portion on the right side of the image 559 from the original image; slides the original image to the side surface display area on the left side and removes the portion on the left side of the image 559; slides the original image to the side surface display area on the upper side and removes the portion on the upper side of the image 559; and slides the original image to the side surface display area on the lower side and removes the portion on the lower side of the image 559, thereby obtaining the image 560. Fig. 16 is a flowchart illustrating another example of a method of operating a mobile terminal according to the present invention. The method will be described hereinafter with reference to the accompanying drawings. Several elements are displayed on the front surface display area S200. The multiple items may include application execution icons and widgets corresponding to applications. At this point, the front surface display area can perform the role of displaying a background screen. The front surface display area may include thumbnail images corresponding to a plurality of image files. At this point, the front surface display area may be displaying an execution screen of an image viewing application. It should be noted, however, that the composition of the several elements is not limited to the above-mentioned examples. If a first touch input is received in the front surface display area while the plural items are displayed on the front surface display area, the controller 180 selects one of the plurality of items on the basis of the first touch input received S210.
If the first touch input is slid into the side surface display area after selection of an item based on the first touch input, the controller 180 activates an item modification function relative to the several items displayed on the front surface display area S230. At this point, the selected element becomes a modification object. An element modification function may include a delete function for a single element, an element arrangement change function, an element size change function, and an element property change function. Nevertheless, the technical scope of the present invention is not limited to the above example. The element modification function can be kept active when, after the element modification function is activated, another element is selected from among the several elements by a touch input and the touch input is dragged into the side surface display area. If a touch input for selecting an item is slid into the side surface display area, the controller 180 may display a thumbnail image of the item corresponding to the slid touch input on the side surface display area. Then, looking at the thumbnail image of the element displayed on the side surface display area, the user can visually identify the element that is a modification object. Fig. 17 is a flowchart illustrating another example of a method of operating a mobile terminal according to the present invention. The operating method will be described hereinafter with reference to the accompanying drawings. It should be noted that this method of operation may be an example of the operating method of a mobile terminal of FIG. 16. While several elements are displayed on the front surface display area of the mobile terminal, a sliding touch input toward the side surface display area with respect to the plural elements is received S300.
The controller 180 then activates an element modification function and displays thumbnail images corresponding to the plural elements on the side surface display area S310. Displaying thumbnail images on the side surface display area may indicate that the elements corresponding to the thumbnail images have become modification objects. The controller 180 divides the front surface display area into a plurality of sub-areas; when a thumbnail image displayed on the side surface display area is touched and slid into one of the plurality of sub-areas, the controller 180 arranges the element corresponding to the slid thumbnail image in the corresponding sub-area S330. At this stage, the controller 180 can change the size of the arranged element so that it corresponds to the size of the sub-area into which the thumbnail image has been dragged. According to the embodiment of the present invention, the controller may change the properties of an element corresponding to a thumbnail image according to the size of the sub-area into which the thumbnail image has been dragged. [0047] If a storage command is received after arranging the elements corresponding to the thumbnail images in the multiple sub-areas based on touch inputs slid with respect to the thumbnail images displayed on the side surface display area, the controller 180 stores the arrangement state of the elements with respect to the several sub-areas S340. Figs. 18 to 20 illustrate examples in which an element modification function is performed according to the operating method of a mobile terminal of Fig. 17. Fig. 18 (a) illustrates a case in which image objects 561 corresponding to a plurality of image files are displayed in the front surface display area 551. An image object may correspond to a thumbnail image representing an image file or to a result of executing the image file. The user may select the image objects indicated by a "V" mark from among the plurality of image objects 561 of Fig. 18 (a) by making a touch movement.
The selection of an image object can be performed by a touch operation such as a long touch input or a double-tap input. The display screen of the front surface display area 551 may be an execution screen of an image viewing application. Fig. 18 (b) illustrates a case in which the user slides the selected image objects to the side surface display areas 552, 554 disposed on both sides of the front surface display area 551. As shown in Fig. 18 (b), the controller 180 displays thumbnail images 562 corresponding to the selected image objects on the side surface display areas 552 and 554. [0048] Fig. 18 (c) illustrates a case in which the user performs a touch operation to delete the image objects displayed on the front surface display area 551 while the thumbnail images 562 are displayed on the side surface display areas 552, 554. As shown in Fig. 18 (c), the touch operation may be a movement of rubbing the front surface display area 551 with a finger, like rubbing with an eraser. Nevertheless, the touch movement for deleting an image object displayed on the front surface display area 551 is not limited to the above example. For example, a touch gesture for deleting an image object displayed on the front surface display area 551 may be a touch gesture such as a long touch input or a double-tap input on the front surface display area 551 or on the side surface display areas 552, 554. [0049] Fig. 19 (a) illustrates a case in which the user performs a touch operation to divide the front surface display area 551 into a plurality of sub-areas after the deletion of the image objects displayed on the front surface display area 551 using the touch motion of Fig. 18 (c). As shown in Fig. 19 (b), the controller 180 can then divide the front surface display area 551 into a plurality of sub-areas (zone 1 to zone 5).
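The division into sub-areas and the drop-to-resize behavior of step S330 can be sketched as follows. The dict-based geometry, the equal-grid split, and both function names are simplifying assumptions; the actual sub-areas of Fig. 19 need not form a regular grid.

```python
def split_grid(width, height, rows, cols):
    """Divide the front display area into rows*cols equal sub-areas,
    each described by its origin and size. (Illustrative only.)"""
    cw, ch = width // cols, height // rows
    return [{"x": c * cw, "y": r * ch, "w": cw, "h": ch}
            for r in range(rows) for c in range(cols)]

def place_in_subarea(element, subarea):
    """Arrange an element in the sub-area its thumbnail was dragged
    into, resizing it to the sub-area (S330-style behavior)."""
    placed = dict(element)
    placed["x"], placed["y"] = subarea["x"], subarea["y"]
    placed["w"], placed["h"] = subarea["w"], subarea["h"]
    return placed


areas = split_grid(1080, 1920, 2, 1)   # assumed front-display size
placed = place_in_subarea({"id": "thumb1", "w": 100, "h": 100}, areas[1])
print(placed)
```

The same placement step is where, per the description, the controller may also adjust other properties of the element based on the size of the receiving sub-area.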
Nevertheless, depending on the implementation of the present invention, the front surface display area 551 can be directly subdivided into several sub-areas on the basis of a touch movement distinguishing the sub-areas as shown in Fig. 19 (a), skipping the deletion of image objects shown in Fig. 18 (c). In the case where an icon 563 for zone segmentation is selected from a graphical user interface provided in the side surface display area 552 as shown in Fig. 19 (a), the controller 180 may subdivide the front surface display area 551 into a plurality of areas based on a touch input received on the front surface display area 551. Fig. 19 (c) illustrates a case in which the thumbnail images 562 displayed on the side surface display areas 552, 554 are touched and dragged into the respective sub-areas. The controller 180 then displays the element corresponding to a slid thumbnail image in proportion to the size of the corresponding area. At this stage, the controller 180 may change the size of the corresponding element based on the size of the sub-area into which the corresponding thumbnail image is dragged. Fig. 20 (a) illustrates a case in which a plurality of elements are disposed on the front surface display area 551 and, based on a touch-and-slide movement relative to the arranged elements, thumbnail images corresponding to the elements are displayed again on the side surface display areas 552, 554. In other words, the mobile terminal provides a remodification function for rearranging elements arranged on a plurality of sub-areas. The graphical user interface 564 displayed on the side surface display area 552 of Fig. 20 (a) may include icons for performing various functions necessary for an element modification function. Fig. 20 (b) illustrates a case in which elements are rearranged on the several sub-areas of the front surface display area 551 and the user stores the element arrangement state by selecting a storage icon 565 provided on the side surface display area 552.
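When the stored arrangement corresponds to image files, merging them into a single new image file is essentially a compositing pass over the layout: each placed image is copied into its sub-area on a common canvas. The function name, dict fields, and 2D-list pixel representation below are illustrative assumptions, not the terminal's actual file format handling.

```python
def combine_layout(canvas_w, canvas_h, placements, fill=0):
    """Compose one image from several placed images, as when a storage
    command merges the files arranged in the sub-areas into a new
    image file. Images are 2D lists of pixel values; each placement
    gives the image and its top-left origin. (Illustrative only.)"""
    canvas = [[fill] * canvas_w for _ in range(canvas_h)]
    for p in placements:
        img, x0, y0 = p["image"], p["x"], p["y"]
        for dy, row in enumerate(img):
            for dx, px in enumerate(row):
                canvas[y0 + dy][x0 + dx] = px
    return canvas


a = [[1, 1], [1, 1]]
b = [[2, 2], [2, 2]]
merged = combine_layout(4, 2, [{"image": a, "x": 0, "y": 0},
                               {"image": b, "x": 2, "y": 0}])
print(merged)
```

Scaling each image to its sub-area before compositing (as the size-change behavior suggests) would be a straightforward extension of this sketch.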
At this point, in the case where the arranged elements correspond to image files, a new image file combining the several image files can be created upon execution of the storage function. Fig. 21 shows examples of creating a new image file based on the image modification method illustrated in Figs. 18 to 20. Fig. 21 (a) illustrates a case in which the front surface display area of a mobile terminal according to the present invention is subdivided into seven sub-areas, image files are arranged in the respective seven sub-areas, a storage function is performed, and a new image file combining the seven image files is created. Fig. 21 (b) illustrates a case in which the front surface display area of a mobile terminal according to the present invention is subdivided into three sub-areas, image files are arranged in the respective three sub-areas, a storage function is performed, and a new image file combining the three image files is created. Fig. 22 illustrates another example of executing an element modification function according to the operating method of a mobile terminal of Fig. 17. Fig. 22 (a) shows an execution screen of an image gallery application and illustrates a case in which an element modification function (in this case a folder size change function) is activated while image objects 566 corresponding to six folders are displayed on the front surface display area 551. Fig. 22 (a) further illustrates a case in which the image objects 566 corresponding to the folders are selected by a touch input and are dragged to the side surface display areas 552, 554. As shown in Fig. 22 (b), thumbnail images 567 corresponding to the folders are then displayed on the side surface display areas 552, 554. The front surface display area 551 is subdivided into several sub-areas. The subdivision can be configured by the user touch input received on the front surface display area 551, as shown in Fig. 19 (a). In the arrangement of Fig. 
22 (b), the respective thumbnail images 567 displayed on the side surface display areas 552, 554 are touched and dragged into the corresponding sub-areas. As shown in Fig. 22 (c), the controller 180 then displays the folder image objects corresponding to the dragged thumbnail images on the front surface display area 551. At this point, as shown in Fig. 22 (c), the controller 180 may change the size of each folder image object based on the size of the corresponding sub-area. Fig. 23 illustrates another example of executing an element modification function according to the operating method of a mobile terminal of Fig. 17. [0050] Fig. 23 (a) illustrates a case in which an image modification function is activated in an execution screen of a contact management application displaying a plurality of contact elements 568 in a buddy list. Fig. 23 (a) further illustrates a case in which contact elements are selected by a touch input and are dragged into the side surface display areas 552, 554. As shown in Fig. 23 (b), the controller 180 then displays thumbnail images 569 corresponding to the contact elements on the side surface display areas 552, 554. The front surface display area 551 is subdivided into a plurality of sub-areas. As shown in Fig. 19 (a), the subdivision can be performed based on the user touch input received on the front surface display area 551. In the arrangement of Fig. 23 (b), the user touches and drags the individual thumbnail images 569 displayed on the side surface display areas 552, 554 into the corresponding sub-areas. As shown in Fig. 23 (c), the controller 180 then displays an image object of the contact element corresponding to each dragged thumbnail image on the front surface display area 551. At this point, as shown in Fig. 23 (c), the size of the image object corresponding to the contact element can be changed based on the size of the corresponding sub-area. As shown in Fig. 
23 (c), the controller 180 may change the image object of the contact element to another image object corresponding to the contact element. For example, a heart-shaped image object 570 may be an image object previously assigned to the contact elements classified in a group of loved ones. A family photo 571 may be an image object previously assigned to the contact elements classified as family members. An image object of a building can indicate that the person of the corresponding contact element works in that building. Fig. 24 illustrates another example of executing an element modification function according to the operating method of a mobile terminal of Fig. 17. [0051] Fig. 24 (a) illustrates a case in which an image modification function is activated on a background screen displaying a plurality of icons. Fig. 24 (a) further illustrates a case in which icons are selected by a touch input and are dragged into the side surface display areas 552, 554, and thumbnail images 574 corresponding to the icons are displayed on the side surface display areas 552, 554. Fig. 24 (b) illustrates a case in which the front surface display area 551 is subdivided into a plurality of sub-areas. As illustrated in Fig. 19 (a), the subdivision can be performed based on the user touch input received on the front surface display area 551. In the arrangement of Fig. 24 (b), the user touches and drags the individual thumbnail images 574 displayed on the side surface display areas 552, 554 into the corresponding sub-areas. [0052] As shown in Fig. 24 (c), the controller 180 displays image objects (icons or widgets) corresponding to the dragged thumbnail images on the front surface display area 551. At this point, as shown in Fig. 24 (c), the controller 180 may change the size of the image object displayed on the corresponding sub-area based on the size of that sub-area. 
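The group-based substitution described above (a heart-shaped object 570 for loved ones, a family photo 571 for family members, a building for a workplace contact) amounts to a lookup from a contact's group to a pre-assigned image object. A minimal sketch, with illustrative group names and an assumed fallback to the contact's own photo, none of which are specified in the disclosure:

```python
# Illustrative only: map a contact's group to a pre-assigned image
# object, as in the heart (570) / family photo (571) / building example.
GROUP_IMAGE_OBJECTS = {
    "loved_ones": "heart_570",
    "family": "family_photo_571",
    "coworkers": "building",
}

def image_object_for(contact):
    """Return the image object pre-assigned to the contact's group,
    falling back to the contact's own photo or a default icon."""
    return GROUP_IMAGE_OBJECTS.get(contact.get("group"),
                                   contact.get("photo", "default_icon"))

chosen = image_object_for({"name": "Mina", "group": "family"})
```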
In the case where the sub-area into which a thumbnail image has been dragged is larger than a predetermined size, the controller 180 displays, on the sub-area, a widget corresponding to the application instead of an execution icon of the application. In the case where an element displayed on the original background screen is a widget of a particular application and the size of the sub-area into which the corresponding thumbnail image has been dragged is smaller than a predetermined size, the controller can display, on the sub-area, an execution icon of the particular application instead of the widget of the particular application. In other words, the controller 180 can change the property of an element displayed on the sub-area into which the corresponding thumbnail image has been dragged based on the size of the sub-area. Fig. 25 illustrates another example of an element modification method performed in a mobile terminal according to the present invention. [0053] Fig. 25 (a) illustrates a case in which a plurality of elements 575, including application execution icons and folder icons, are displayed on the front surface display area 551, and the user selects elements by applying a touch input and dragging the touch input to the side surface display areas 552, 554. The controller 180 then displays thumbnail images 576 corresponding to the selected elements on the side surface display areas 552, 554. As shown in Fig. 25 (b), if an auto-arrange command is received in the arrangement of Fig. 25 (a), the controller 180 creates a new background screen and automatically arranges the elements corresponding to the thumbnail images displayed on the side surface display areas 552, 554 on the new background screen of the front surface display area 551. [0054] In another embodiment of the present invention, as shown in Fig. 19 (a), in the case where the front surface display area 551 is subdivided into sub-areas in the arrangement of Fig. 
25 (a) by the user's touch movement and an auto-arrange command is received, the controller 180 can automatically arrange the elements corresponding to the thumbnail images in the sub-areas. Fig. 26 illustrates another example of an element modification method performed in a mobile terminal according to the present invention. Fig. 26 (a) illustrates a case in which the user repeats a touch movement moving from one side to the other on the front surface display area 551 while a background screen comprising a plurality of elements is displayed on the front surface display area 551. As shown in Fig. 26 (b), the controller 180 displays thumbnail images corresponding to the elements displayed on the front surface display area 551 on the side surface display areas 552, 554 based on the touch movement. While the thumbnail images are displayed on the side surface display areas 552, 554, the user again repeats a touch movement moving from one side to the other on the front surface display area 551. As shown in Fig. 26 (c), the controller 180 then restores the original background screen and removes the thumbnail images displayed on the side surface display areas 552, 554. Fig. 27 is a flowchart illustrating another example of a method of operating a mobile terminal according to the present invention. The mobile terminal operating method will be described hereinafter with reference to the accompanying drawings. While a plurality of elements are displayed on the front surface display area 551 (S400), an element is selected by a touch input and the touch input is dragged to the side surface display area (S410). The touch-and-drag movement can be performed with respect to various elements. The controller 180 displays a thumbnail image of the selected element on the side surface display area and displays a graphical user interface for sharing the selected element on the side surface display area (S420). 
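As an informal sketch of the element-sharing flow outlined above (steps S400 to S420, followed by dispatch to whichever application the user selects), the controller-side state could be modeled as follows; the application names and callbacks are hypothetical illustrations, not part of the disclosure:

```python
# Illustrative sketch: an element dragged to the side area is queued and
# its thumbnail shown (S410-S420); the sharing GUI offers application
# icons; selecting one dispatches the queued elements to that application.

class ShareController:
    def __init__(self, apps):
        self.apps = apps          # app name -> share callback (assumed)
        self.queued = []          # elements whose thumbnails sit on the side area

    def drag_to_side(self, element):
        """S410-S420: queue the dragged element, return the app icons to offer."""
        self.queued.append(element)
        return list(self.apps)

    def share_with(self, app_name):
        """Dispatch every queued element via the selected application."""
        shared = [self.apps[app_name](element) for element in self.queued]
        self.queued.clear()
        return shared

ctrl = ShareController({
    "email": lambda element: f"email:{element}",
    "bluetooth": lambda element: f"bt:{element}",
})
icons = ctrl.drag_to_side("photo_1")
result = ctrl.share_with("email")
```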
The graphical user interface may include icons for selecting an application with which to perform the function of sharing the selected element. Applications for this purpose can include various types of SNS applications, message composition applications, e-mail composition applications, and element-sharing applications employing short-range wireless communication such as Bluetooth or WiFi. Nevertheless, the technical scope of the present invention is not limited to the above examples. If an icon for performing the function of sharing the selected element is selected through the graphical user interface, the controller 180 performs the function of sharing the selected element using the selected application (S430). Fig. 28 illustrates an example of executing a function of modifying an element displayed on a front surface display area 551 according to the operating method of a mobile terminal of Fig. 27. Fig. 28 (a) illustrates a case in which a plurality of elements displayed on the front surface display area 551 are selected by a touch input and are dragged into the side surface display area 552. The controller 180 then displays thumbnail images 557 of the selected elements on the side surface display area 552 based on the dragged touch input and displays, on the other side surface display area 554, a graphical user interface 578 for selecting an application for sharing the selected elements. The user can then perform the function of sharing the selected elements by selecting the icon of an application performing a sharing function on the graphical user interface 578. As shown in Fig. 28 (b), the controller 180 may provide a graphical user interface 579 for performing a delete function or a copy function on a selected element based on the touch-and-drag movement to the other side surface display area 554. Various embodiments may be implemented using a machine-readable medium on which are stored instructions to be executed by a processor to perform the various methods described herein. 
Examples of possible machine-readable media include a hard disk drive (HDD), a solid-state drive (SSD), a silicon disk drive (SDD), a read-only memory (ROM), a random-access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, the other types of storage media described herein, and combinations thereof. If desired, the machine-readable medium may be embodied in the form of a carrier wave (e.g., a transmission over the Internet). The processor may include the controller 180 of the mobile terminal. The present teachings may be readily applied to other types of methods and apparatuses. The present description is intended to be illustrative and should not be construed as limiting the scope of the appended claims. Numerous variations, modifications, and alternatives will be apparent to those skilled in the art. The features, structures, methods, and other characteristics of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. As the present features may be embodied in several forms without departing from their characteristics, it should also be understood that the embodiments described above are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within the scope defined in the appended claims. All changes and modifications that fall within the scope of the appended claims, or within limits equivalent to the limits of the appended claims, are therefore intended to be embraced by the appended claims.
Claims (17) [0001] 1. A mobile terminal (100, 200, 300, 400), comprising: a display unit (151, 251, 351, 451) configured to incorporate a touch screen function and to include a front surface display area (551) and a side surface display area (552, 554); and a controller (180) configured to activate a modification function for an image displayed on the front surface display area (551) in the case where the image is dragged beyond the front surface display area (551) in a direction toward the side surface display area (552, 554). [0002] The mobile terminal (100, 200, 300, 400) according to claim 1, wherein a portion of the image dragged beyond the front surface display area (551) is displayed on the side surface display area (552, 554). [0003] The mobile terminal (100, 200, 300, 400) according to claim 1, wherein the controller (180) is configured to activate the modification function for the image in the case where a drag movement on the image is maintained for a predetermined duration. [0004] The mobile terminal (100, 200, 300, 400) according to claim 2 or 3, wherein the controller (180) is configured to cut off, from the image, the portion of the image dragged beyond the front surface display area (551). [0005] The mobile terminal (100, 200, 300, 400) according to claim 2 or 3, wherein the controller (180) is configured to perform a modification function for the portion of the image dragged beyond the front surface display area (551). [0006] The mobile terminal (100, 200, 300, 400) according to claim 2 or 3, wherein the controller (180) is configured to display, on the side surface display area (552, 554), a graphical user interface (564) for performing a modification function for the portion of the image dragged beyond the front surface display area (551). 
[0007] The mobile terminal (100, 200, 300, 400) according to claim 2 or 3, wherein the display unit is further configured to include an additional side surface display area (552, 554) disposed opposite the side surface display area (552, 554), and the controller (180) is configured to display, on the additional side surface display area (552, 554), at least one of a graphical user interface (564) for performing a function of modifying the image, a result of the modification of the image, and an image modification process. [0008] The mobile terminal (100, 200, 300, 400) according to claim 2 or 3, wherein, in a case where a touch input of a predetermined pattern is received on the front surface display area (551) or on the side surface display area (552, 554), the controller (180) is configured to acquire a screen displayed on the front surface display area (551) and to display the acquired image on the front surface display area (551). [0009] A mobile terminal (100, 200, 300, 400), comprising: a display unit (151, 251, 351, 451) configured to incorporate a touch screen function and to include a front surface display area (551) and a side surface display area (552, 554); and a controller (180) configured to select an item based on a first touch input received on the front surface display area (551) displaying a plurality of items and to activate a modification function in the front surface display area (551) if the first touch input is dragged into the side surface display area (552, 554). [0010] The mobile terminal (100, 200, 300, 400) according to claim 9, wherein the controller (180) is configured to display a thumbnail image corresponding to the selected item on the 
side surface display area (552, 554) based on the dragging of the first touch input to the side surface display area (552, 554); to subdivide the front surface display area (551) into a plurality of sub-areas based on at least a second touch input received on the front surface display area (551); and, if a thumbnail image displayed on the side surface display area (552, 554) is selected by a third touch input and the third touch input is dragged into one of the plurality of sub-areas, to arrange an item corresponding to the selected thumbnail image on that sub-area. [0011] The mobile terminal (100, 200, 300, 400) according to claim 10, wherein the controller (180) is configured to change a size of the item corresponding to the selected thumbnail image arranged on the sub-area so that it is proportional to the sub-area. [0012] The mobile terminal (100, 200, 300, 400) according to claim 11, wherein the plurality of items are image objects representing a plurality of image files, and, if image objects corresponding to image files are respectively arranged on two or more of the several sub-areas and a storage command is received, the controller (180) is configured to create and store a new image file in the form of a combination of the image files corresponding to the two or more sub-areas. [0013] The mobile terminal (100, 200, 300, 400) according to claim 11, wherein the controller (180) is configured to change properties of the item corresponding to the selected thumbnail image displayed on the sub-area based on the size of the sub-area. 
[0014] The mobile terminal (100, 200, 300, 400) according to claim 11, wherein, in the case where the selected item is an execution icon of a particular application and a size of the sub-area is greater than a predetermined size, the controller (180) is configured to change the item corresponding to the selected thumbnail image into a widget corresponding to the particular application and to display the widget corresponding to the particular application on the sub-area. [0015] The mobile terminal (100, 200, 300, 400) according to claim 14, wherein, in the case where the selected item is a widget corresponding to a particular application and a size of the modified image object is smaller than a predetermined size, the controller (180) is configured to change the item corresponding to the selected thumbnail image from a widget corresponding to the particular application into an execution icon of the particular application and to display the execution icon of the particular application on the sub-area. [0016] The mobile terminal (100, 200, 300, 400) according to claim 9, wherein, if the first touch input relative to each of two or more of the plurality of items is dragged to the side surface display area (552, 554), the controller (180) is configured to display thumbnail images corresponding to the two or more items on the side surface display area (552, 554), to create a new background screen based on an auto-arrange command, and to automatically arrange the two or more items on the new background screen. 
[0017] The mobile terminal (100, 200, 300, 400) according to claim 16, wherein the controller (180) is configured to subdivide the front surface display area (551) into a plurality of sub-areas based on at least a third touch input received on the front surface display area (551) and to create the new automatically arranged background screen in which the two or more individual items are automatically arranged in the corresponding sub-areas among the several sub-areas based on the auto-arrange command.